A Training Algorithm for Optimal Margin Classifiers

Authors

  • Bernhard E. Boser
  • Isabelle M. Guyon
  • Vladimir N. Vapnik
Abstract

A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted automatically to match the complexity of the problem. The solution is expressed as a linear combination of supporting patterns. These are the subset of training patterns that are closest to the decision boundary. Bounds on the generalization performance based on the leave-one-out method and the VC-dimension are given. Experimental results on optical character recognition problems demonstrate the good generalization obtained when compared with other learning algorithms.
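
As a concrete illustration of the margin-maximization idea described in the abstract, the following is a minimal sketch in Python, assuming scikit-learn's SVC with a very large C as a stand-in for the paper's hard-margin optimization; the toy dataset and all names are illustrative, not the authors' implementation.

    import numpy as np
    from sklearn.svm import SVC

    # Two linearly separable point clouds with labels -1 and +1.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
    y = np.array([-1] * 20 + [1] * 20)

    # C -> infinity approximates the hard-margin (maximal margin) classifier.
    clf = SVC(kernel="linear", C=1e6).fit(X, y)

    # The solution is a linear combination of the supporting patterns:
    # f(x) = sum_i alpha_i y_i <x_i, x> + b, with alpha_i > 0 only for
    # the training points closest to the decision boundary.
    print("support vectors:", clf.support_vectors_)
    print("margin width:", 2 / np.linalg.norm(clf.coef_))

Only the support vectors receive nonzero coefficients (exposed here via clf.dual_coef_), which mirrors the paper's observation that the effective number of parameters tracks the number of supporting patterns rather than the input dimension.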


Similar resources

A PAC-Bayesian Margin Bound for Linear Classifiers: Why SVMs Work

We present a bound on the generalisation error of linear classifiers in terms of a refined margin quantity on the training set. The result is obtained in a PAC-Bayesian framework and is based on geometrical arguments in the space of linear classifiers. The new bound constitutes an exponential improvement of the so far tightest margin bound by Shawe-Taylor et al. [8] and scales logarithmically in th...


Bayesian Classifiers Are Large Margin Hyperplanes in a Hilbert Space

Bayesian algorithms for Neural Networks are known to produce classifiers which are very resistant to overfitting. It is often claimed that one of the main distinctive features of Bayesian Learning Algorithms is that they don't simply output one hypothesis, but rather an entire probability distribution over a hypothesis set: the Bayes posterior. An alternative perspective is that they output a...



Boosting Algorithms as Gradient Descent in Function Space

Much recent attention, both experimental and theoretical, has been focussed on classification algorithms which produce voted combinations of classifiers. Recent theoretical work has shown that the impressive generalization performance of algorithms like AdaBoost can be attributed to the classifier having large margins on the training data. We present abstract algorithms for finding linear and conve...
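
To make the "gradient descent in function space" view concrete, here is a minimal AdaBoost-style sketch in Python, assuming decision stumps as base learners and the exponential loss. It is an illustrative instance of the idea, not the abstract formulation referenced above; all names are hypothetical.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def boost(X, y, rounds=10):
        # y must be a numpy array with labels in {-1, +1}.
        F = np.zeros(len(y))          # current ensemble output, a point in function space
        ensemble = []                 # list of (step size, base classifier) pairs
        for _ in range(rounds):
            w = np.exp(-y * F)        # exp-loss weights; the functional gradient's magnitude
            w /= w.sum()
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            h = stump.predict(X)
            err = w[h != y].sum()     # weighted training error of the new direction
            if err >= 0.5:            # no descent direction left among the stumps
                break
            alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))  # line-search step size
            F += alpha * h            # move the ensemble along the chosen direction
            ensemble.append((alpha, stump))
        return ensemble

Each round fits a base classifier that best correlates with the negative gradient of the loss (via the sample weights), then takes a line-search step, which is exactly the functional gradient descent pattern.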


Query Learning with Large Margin Classifiers

The active selection of instances can significantly improve the generalisation performance of a learning machine. Large margin classifiers such as support vector machines classify data using the most informative instances (the support vectors). This makes them natural candidates for instance selection strategies. In this paper we propose an algorithm for the training of support vector machines u...
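
The instance-selection idea lends itself to a short sketch. The following is a minimal pool-based example in Python, assuming scikit-learn's SVC and a strategy of querying the unlabelled point closest to the current decision boundary; the setup and names are illustrative assumptions, not the proposed algorithm.

    import numpy as np
    from sklearn.svm import SVC

    def next_query(clf, X_pool):
        # Query the unlabelled point closest to the decision boundary,
        # i.e. the candidate most likely to become a support vector.
        return int(np.argmin(np.abs(clf.decision_function(X_pool))))

    # Usage: retrain on the labelled set after each query.
    # clf = SVC(kernel="linear").fit(X_labelled, y_labelled)
    # i = next_query(clf, X_pool)   # ask the oracle for the label of X_pool[i]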



Publication date: 1992